
[Internal] Use Plugin Framework types internally in generated TF SDK structures #4291

Merged 33 commits into main on Dec 10, 2024

Conversation

mgyucht
Contributor

@mgyucht mgyucht commented Dec 4, 2024

Changes

Under some circumstances, the generated TFSDK structures may crash when trying to read the plan during create or update. This can happen specifically when the generated structure includes a plain Go type, such as a slice or map, and the corresponding attribute is either null or unknown in the plan. The issue is that the plugin framework does not have a way to represent these states in a plain slice or map, so it simply fails.
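To illustrate the failure mode, here is a minimal stdlib-only sketch (the type below is a hypothetical stand-in; the real plugin framework `types.List` tracks the same three states internally):

```go
package main

import "fmt"

// ListValue is a hypothetical stand-in for a plugin-framework list value.
// Like types.List, it can represent three states: null, unknown, or known.
type ListValue struct {
	null, unknown bool
	elems         []string
}

func (v ListValue) State() string {
	switch {
	case v.null:
		return "null"
	case v.unknown:
		return "unknown"
	default:
		return fmt.Sprint(v.elems)
	}
}

func main() {
	// A plain Go slice collapses both "null" and "unknown" into nil,
	// so there is no way to round-trip those plan states through it.
	var plain []string
	fmt.Println(plain == nil)

	// A framework-style value keeps the three states distinct.
	fmt.Println(ListValue{null: true}.State())
	fmt.Println(ListValue{unknown: true}.State())
	fmt.Println(ListValue{elems: []string{"a"}}.State())
}
```

A plain slice offers only one "empty" representation (`nil`), which is why decoding a null or unknown plan attribute into it fails.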

Terraform recommends using the plugin framework types for all fields, including lists, objects, and maps. This PR changes the generated structures to use types.List and types.Map for all list, object, and map attributes in all generated code.

Because these types do not include compile-time metadata about the type of the contained element, the contained element types are made accessible at runtime through a new GetComplexFieldTypes() method. This returns a map from resource field name (specifically, the value of the tfsdk tag) to the reflect.Type instance of the contained type. The contained type must be either a primitive type from the plugin framework type system or a TF SDK struct type. In this PR, I added support for only four primitive types: types.String, types.Bool, types.Int64, and types.Float64.

Additional methods are also added via code generation (which accounts for most of the lines of code in this PR). They are:

  • Type(context.Context) attr.Type returns the Terraform type for the current object. This is always a basetypes.ObjectType. It should recursively call the same method on other TF SDK structures that are contained in list, map, or object fields.
  • ToObjectValue(context.Context) types.ObjectValue converts the TF SDK object to a types.ObjectValue instance. This makes it simpler to construct other attr.Values, such as lists and maps, from a TF SDK object.
  • Get...(context.Context) and Set...(context.Context, ...) are convenience getters and setters for list, map, and object fields within your structure.

GoSdkToTfSdkStruct, TfSdkToGoSdkStruct, and ResourceStructToSchema are all updated to handle the new structure of these TF SDK structs.

This PR does not change the default behavior of treating nested structs & pointers to structs as list blocks. However, it does add support for types.Object, so when we decide to change such fields to use this later, it should work as expected. Additionally, support for list attributes is not implemented here, but the change should be fairly easy (a small tweak in typeToSchema).

Note: I did need to make a manual change to one autogenerated file: the ComplianceSecurityProfile references ComplianceStandards from the settings package. This is an enum, so it should be treated as a string, but our current API spec doesn't provide enough metadata for our autogeneration templates to determine this, so it is treated as an object. This should be fixed after @renaudhartert-db's work to add duplicated structs in the API spec to eliminate cross-package dependencies.

Tests

Existing tests should continue to pass. Added a test case covering types.Object fields in the conversion functions and in ResourceStructToSchema.

I have been running the integration tests from a feature branch implementing the databricks_app resource, which has succeeded locally.

@mgyucht mgyucht requested review from a team as code owners December 4, 2024 09:07
@mgyucht mgyucht requested review from parthban-db and removed request for a team December 4, 2024 09:07
@mgyucht mgyucht requested review from rauchy and tanmay-db and removed request for parthban-db December 9, 2024 13:56
Contributor

@rauchy rauchy left a comment


Looks great to me! The tests and comments are really helpful. Awesome job!

```go
SliceStructPtr types.List   `tfsdk:"slice_struct_ptr" tf:"optional"`
Irrelevant     types.String `tfsdk:"-"`
Object         types.Object `tfsdk:"object" tf:"optional"`
ObjectPtr      types.Object `tfsdk:"object_ptr" tf:"optional"`
```

Remove optional?


If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:

Trigger:
go/deco-tests-run/terraform

Inputs:

  • PR number: 4291
  • Commit SHA: 21c5917eecad5dfa082b3c230e873e7949ec5627

Checks will be approved automatically on success.

@eng-dev-ecosystem-bot

Test Details: go/deco-tests/12259752305

@mgyucht mgyucht enabled auto-merge December 10, 2024 15:59
@mgyucht mgyucht added this pull request to the merge queue Dec 10, 2024
Merged via the queue into main with commit b1f0847 Dec 10, 2024
12 checks passed
@mgyucht mgyucht deleted the use-tftypes-everywhere branch December 10, 2024 17:21
mgyucht added a commit that referenced this pull request Dec 12, 2024
### New Features and Improvements

 * Add `databricks_app` resource and data source ([#4099](#4099)).

### Documentation

 * Add a warning that attribute should be used in `databricks_permissions` for `databricks_vector_search_endpoint` ([#4312](#4312)).

### Internal Changes

 * Added TF Plugin Framework checkbox item to PR template and removed checkbox for integration tests passing ([#4227](#4227)).
 * Expose several integration test helpers for use in plugin framework integration tests ([#4310](#4310)).
 * Fix ReadOnly() for ListNestedAttribute and Validators for ListNestedBlock ([#4313](#4313)).
 * Panic if the provided path is invalid ([#4309](#4309)).
 * Simplify `databricks_storage_credential` code ([#4301](#4301)).
 * Use Attributes by default for List Objects ([#4315](#4315)).
 * Use Plugin Framework types internally in generated TF SDK structures ([#4291](#4291)).
github-merge-queue bot pushed a commit that referenced this pull request Dec 12, 2024